A Scale Mixture Perspective of Multiplicative Noise in Neural Networks
Abstract
Corrupting the input and hidden layers of deep neural networks (DNNs) with multiplicative noise, often drawn from the Bernoulli distribution (or ‘dropout’), provides regularization that has significantly contributed to deep learning’s success. However, understanding how multiplicative corruptions prevent overfitting has been difficult due to the complexity of a DNN’s functional form. In this paper, we show that when a Gaussian prior is placed on a DNN’s weights, applying multiplicative noise induces a Gaussian scale mixture, which can be reparameterized to circumvent the problematic likelihood function. Analysis can then proceed by using a type-II maximum likelihood procedure to derive a closed-form expression revealing how regularization evolves as a function of the network’s weights. Results show that multiplicative noise forces weights to become either sparse or invariant to rescaling. We find our analysis has implications for model compression as it naturally reveals a weight pruning rule that starkly contrasts with the commonly used signal-to-noise ratio (SNR). While the SNR prunes weights with large variances, seeing them as noisy, our approach recognizes their robustness and retains them. We empirically demonstrate our approach has a strong advantage over the SNR heuristic and is competitive with retraining with soft targets produced from a teacher model.
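To make the scale-mixture claim concrete, here is a minimal sketch in our own notation (not necessarily the paper's): conditioning on the multiplicative noise rescales the variance of the Gaussian prior, and marginalizing over the noise yields a Gaussian scale mixture.

```latex
% Illustrative sketch: multiplicative noise on a Gaussian-prior weight
% induces a Gaussian scale mixture (notation is ours, not the paper's).
% Let w ~ N(0, sigma^2) and let xi ~ p(xi) be independent multiplicative
% noise, e.g. xi ~ Bernoulli(pi) for dropout. For the corrupted weight
% tilde{w} = xi * w:
\tilde{w} \mid \xi \sim \mathcal{N}\!\left(0,\ \xi^{2}\sigma^{2}\right),
\qquad
p(\tilde{w}) = \int \mathcal{N}\!\left(\tilde{w};\ 0,\ \xi^{2}\sigma^{2}\right) p(\xi)\,\mathrm{d}\xi .
```

The SNR pruning heuristic that the abstract argues against can be sketched as follows; the paper's own retention rule is not reproduced here, since the abstract does not state its closed form, and all names below are illustrative.

```python
import numpy as np

def snr_keep_mask(mean, std, threshold=1.0):
    """Signal-to-noise-ratio pruning heuristic: keep weight i only if
    |mean_i| / std_i > threshold. Note how it discards large-variance
    weights, which the abstract argues are robust and worth retaining."""
    return np.abs(mean) / std > threshold

# Hypothetical posterior statistics for three weights:
mean = np.array([0.9, 0.01, 1.5])
std = np.array([0.1, 0.05, 2.0])
print(snr_keep_mask(mean, std))  # [ True False False]: the third weight is
# pruned by SNR despite its large mean, purely because of its large variance.
```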
Similar resources
Prediction of monthly rainfall using artificial neural network mixture approach, Case Study: Torbat-e Heydariyeh
Rainfall is one of the most important elements of the water cycle used in evaluating the climate conditions of each region. Long-term rainfall forecasts for arid and semi-arid regions are very important for the management and planning of water resources. To forecast appropriately, accurate data on humidity, temperature, pressure, wind speed, etc. are required. This article is analytical and its database...
Switching Dynamics of Neural Systems in the Presence of Multiplicative Colored Noise
We study the dynamics of a simple bistable system driven by multiplicative correlated noise. Such a system mimics the dynamics of classical attractor neural networks with an additional source of noise associated, for instance, with the stochasticity of synaptic transmission. We found that the multiplicative noise, which acts as a fluctuating barrier separating the stable solutions, strongly i...
The Application of Multi-Layer Artificial Neural Networks in Speckle Reduction (Methodology)
Optical Coherence Tomography (OCT) uses the spatial and temporal coherence properties of optical waves backscattered from a tissue sample to form an image. An inherent characteristic of coherent imaging is the presence of speckle noise. In this study, we use a new ensemble framework, a combination of several Multi-Layer Perceptron (MLP) neural networks, to denoise OCT images. The noise is...
Novel Radial Basis Function Neural Networks based on Probabilistic Evolutionary and Gaussian Mixture Model for Satellites Optimum Selection
In this study, two novel learning algorithms have been applied to a Radial Basis Function Neural Network (RBFNN) to approximate highly non-linear functions. The Probabilistic Evolutionary (PE) and Gaussian Mixture Model (GMM) techniques are proposed to significantly minimize the error functions. The main idea concerns the various strategies for optimizing the procedure of Gradient ...
Multiplicative Normalizing Flows for Variational Bayesian Neural Networks
We reinterpret multiplicative noise in neural networks as auxiliary random variables that augment the approximate posterior in a variational setting for Bayesian neural networks. We show that through this interpretation it is both efficient and straightforward to improve the approximation by employing normalizing flows (Rezende & Mohamed, 2015) while still allowing for local reparametrizations ...
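As a pointer to what the cited flows look like, here is a minimal sketch of one planar normalizing-flow step from Rezende & Mohamed (2015); variable names are illustrative, and this is not necessarily the flow architecture used in the referenced paper.

```python
import numpy as np

def planar_flow_step(z, u, w, b):
    """One planar normalizing-flow step, f(z) = z + u * tanh(w.z + b)
    (Rezende & Mohamed, 2015). Stacking such invertible steps transforms
    a simple noise distribution into a more flexible one."""
    return z + u * np.tanh(np.dot(w, z) + b)

def planar_flow_log_det(z, u, w, b):
    """log|det df/dz| = log|1 + u . psi(z)|, with psi(z) = tanh'(w.z + b) w;
    needed to track the density of the transformed noise variable."""
    psi = (1.0 - np.tanh(np.dot(w, z) + b) ** 2) * w
    return np.log(np.abs(1.0 + np.dot(u, psi)))
```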